New Three-Term Conjugate Gradient Method with Exact Line Search


Similar articles

The Gradient Projection Method with Exact Line Search

The gradient projection algorithm for function minimization is often implemented using an approximate local minimization along the projected negative gradient. On the other hand, for some difficult combinatorial optimization problems, where a starting guess may be far from a solution, it may be advantageous to perform a nonlocal (exact) line search. In this paper we show how to evaluate the pie...
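
To make the local-versus-nonlocal distinction concrete, here is a minimal sketch (not the paper's algorithm) of a projected gradient step with an exact line search for a convex quadratic over the nonnegative orthant; the objective, the matrix `A`, and the simple projection are illustrative assumptions, and the search is carried out along a fixed feasible direction rather than along the piecewise-linear projected path discussed in the paper.

```python
import numpy as np

def projected_gradient_exact(A, b, x0, iters=50):
    """Minimize f(x) = 0.5*x'Ax - b'x over x >= 0 (illustrative sketch).

    Each step moves along d = P(x - grad f(x)) - x, where P projects onto
    the nonnegative orthant; the step length in [0, 1] is then chosen by an
    exact (closed-form) line search, which is possible because f is quadratic
    and the feasible set is convex.
    """
    x = np.maximum(x0, 0.0)
    for _ in range(iters):
        g = A @ x - b                      # gradient of the quadratic
        d = np.maximum(x - g, 0.0) - x     # feasible descent direction
        dAd = d @ A @ d
        if dAd <= 1e-16:                   # d is (numerically) zero: stationary point
            break
        t = min(1.0, -(g @ d) / dAd)       # exact minimizer of f(x + t*d) on [0, 1]
        x = x + t * d
    return x

# Usage: a small positive definite example; the solution sits on the boundary.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -4.0])
print(projected_gradient_exact(A, b, np.ones(2)))   # approx. [1/3, 0]
```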


A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search

A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition $g_k^T d_k \le -\frac{7}{8}\|g_k\|^2$. Moreover, a global convergence result is establis...
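
The guaranteed-descent bound above is an algebraic property of the Hager–Zhang search direction that holds independently of the line search, so it can be spot-checked numerically. The sketch below uses the commonly stated form of their $\beta$ and random test vectors; it is an illustration of the inequality, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hz_direction(g_new, g_old, d_old):
    """Hager-Zhang direction d = -g_new + beta * d_old (commonly stated form)."""
    y = g_new - g_old
    tau = d_old @ y
    beta = (y @ g_new) / tau - 2.0 * (y @ y) * (d_old @ g_new) / tau**2
    return -g_new + beta * d_old

# Spot-check the descent bound g'd <= -(7/8)*||g||^2 on random data.
for _ in range(1000):
    g_new, g_old, d_old = rng.normal(size=(3, 5))
    if abs(d_old @ (g_new - g_old)) < 1e-8:
        continue                            # skip near-degenerate samples (d'y ~ 0)
    d = hz_direction(g_new, g_old, d_old)
    assert g_new @ d <= -0.875 * (g_new @ g_new) + 1e-9
print("descent bound held on all sampled directions")
```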


Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search

… $= \frac{\|d_{k-1}\|^2}{\|g_{k-1}\|^4} + \frac{1}{\|g_k\|^2} - \frac{\beta_k^2 (g_k^T d_{k-1})^2}{\|g_k\|^4}$
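
The excerpt above is a fragment of a direction-norm recursion. As a sketch of how such a recursion can arise (under our own illustrative assumptions of a Fletcher–Reeves-type parameter $\beta_k = \|g_k\|^2/\|g_{k-1}\|^2$ and a direction rescaled so that $g_k^T d_k = -\|g_k\|^2$, which need not match the paper's exact construction): taking
\[
d_k = -\Bigl(1 + \beta_k \frac{g_k^T d_{k-1}}{\|g_k\|^2}\Bigr) g_k + \beta_k d_{k-1}
\quad\Rightarrow\quad
\|d_k\|^2 = \|g_k\|^2 - \frac{\beta_k^2 (g_k^T d_{k-1})^2}{\|g_k\|^2} + \beta_k^2 \|d_{k-1}\|^2 ,
\]
and dividing by $\|g_k\|^4$ yields
\[
\frac{\|d_k\|^2}{\|g_k\|^4} = \frac{\|d_{k-1}\|^2}{\|g_{k-1}\|^4} + \frac{1}{\|g_k\|^2} - \frac{\beta_k^2 (g_k^T d_{k-1})^2}{\|g_k\|^4},
\]
exactly the quantity in the excerpt; bounds of this form are the usual starting point for Zoutendijk-type global convergence arguments.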


Some global convergence properties of the Wei-Yao-Liu conjugate gradient method with inexact line search

In [Z. Wei, S. Yao, L. Liu, The convergence properties of some new conjugate gradient methods, Applied Mathematics and Computation 183 (2006) 1341–1350], Wei et al. proposed a new conjugate gradient method, called the WYL method, which has good numerical performance and some excellent properties such as $\beta_k \ge 0$. In this paper, we prove that while $t_k \le \frac{1-c}{2L}\frac{\|g_k\|}{\|d_k\|}$, the sufficient descent conditio...
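
For context, the WYL parameter referred to in the excerpt is commonly written as
\[
\beta_k^{\mathrm{WYL}} = \frac{\|g_k\|^2 - \dfrac{\|g_k\|}{\|g_{k-1}\|}\, g_k^T g_{k-1}}{\|g_{k-1}\|^2},
\]
where $g_k$ is the gradient at iterate $k$; the nonnegativity property $\beta_k \ge 0$ mentioned above then follows directly from the Cauchy–Schwarz inequality $g_k^T g_{k-1} \le \|g_k\|\,\|g_{k-1}\|$.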


A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization

In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems; it possesses the sufficient descent property with the strong Wolfe–Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
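
For readers who want the conditions spelled out, the sketch below checks the strong Wolfe–Powell conditions for a given step length; the quadratic objective, its gradient, and the constants c1 = 1e-4 and c2 = 0.1 are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad_f, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe-Powell (SWP) conditions for step length alpha.

    Sufficient decrease: f(x + a*d) <= f(x) + c1 * a * g'd
    Strong curvature:    |grad f(x + a*d)' d| <= c2 * |g'd|
    with 0 < c1 < c2 < 1 and d a descent direction (g'd < 0).
    """
    g_d = grad_f(x) @ d
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + c1 * alpha * g_d
    strong_curvature = abs(grad_f(x_new) @ d) <= c2 * abs(g_d)
    return sufficient_decrease and strong_curvature

# Usage on a simple quadratic along the steepest descent direction:
# alpha = 1.0 (the exact minimizer) is accepted, a tiny step is rejected.
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
x = np.array([2.0, -1.0])
d = -grad_f(x)
print(satisfies_strong_wolfe(f, grad_f, x, d, alpha=1.0))   # True
print(satisfies_strong_wolfe(f, grad_f, x, d, alpha=0.1))   # False (curvature fails)
```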



Journal

Journal title: MATEMATIKA

Year: 2020

ISSN: 0127-9602, 0127-8274

DOI: 10.11113/matematika.v36.n3.1214